
    Model-Based Control of Flying Robots for Robust Interaction under Wind Influence

    The main goal of this thesis is to bridge the gap between trajectory tracking and interaction control for flying robots, in order to allow physical interaction under wind influence by making aerial robots aware of the disturbances, interactions, and faults acting on them. This is accomplished by reasoning about the external wrench (force and torque) acting on the robot and discriminating between wind, interactions, and collisions. This poses the following research questions. First, is discrimination between the external wrench components even possible in a continuous, real-time fashion for control purposes? Second, given the individual wrench components, what are effective control schemes for interaction and trajectory tracking control under wind influence? Third, how can unexpected faults, such as collisions with the environment, be detected and handled efficiently and effectively? In the interest of the first question, a fourth can be posed: is it possible to obtain a measurement of the wind speed that is independent of the external wrench? In this thesis, model-based methods are applied in pursuit of answers to these questions. This requires a good dynamics model of the robot, as well as accurately identified parameters. Therefore, a systematic parameter identification procedure for aerial robots is developed and applied. Furthermore, external wrench estimation techniques from the field of robot manipulators are extended to be suitable for aerial robots without the need for velocity measurements, which are difficult to obtain in this context. Based on the external wrench estimate, interaction control techniques (impedance and admittance control) are extended and applied to flying robots, and a thorough stability proof is provided.
Similarly, the wrench estimate is applied in a geometric trajectory tracking controller to compensate for external disturbances, providing zero steady-state error under wind influence without the need for integral control action. The controllers are finally combined into a novel compensated impedance controller to facilitate the main goal of the thesis. Collision detection is applied to flying robots, providing a low-level reflex reaction that increases the safety of these autonomous robots. In order to identify aerodynamic models for wind speed estimation, flight experiments in a three-dimensional wind tunnel were performed using a custom-built hexacopter. These data are used to investigate wind speed estimation with different data-driven aerodynamic models. It is shown that good performance can be obtained using relatively simple linear regression models. In this context, the propeller aerodynamic power model is used to obtain information about the wind speed from available motor power measurements. Leveraging the wind tunnel data, it is shown that power can be used to obtain the wind speed. Furthermore, a novel optimization-based method that leverages the propeller aerodynamics model is developed to estimate the wind speed. Essentially, these two methods use the propellers as wind speed sensors, thereby providing an additional measurement that is independent of the external force. Finally, the novel topic of simultaneously discriminating between aerodynamic, interaction, and fault wrenches is opened up. This enables the implementation of novel types of controllers that are, e.g., compliant to physical interaction while compensating for wind disturbances at the same time. The previously unexplored force discrimination topic has the potential to open a new research avenue for flying robots.
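The data-driven linear regression approach to wind speed estimation mentioned in the abstract can be sketched as follows. Everything below is a synthetic illustration under assumed conditions, not the thesis's identified model: per-motor power measurements of a hexacopter are regressed against a wind-speed component via ordinary least squares.

```python
import numpy as np

# Illustrative sketch: fit a linear model mapping per-motor power
# measurements to one wind-speed component, as a stand-in for the
# data-driven regression models described in the abstract.
# All data and coefficients here are synthetic.

rng = np.random.default_rng(0)
n_samples, n_motors = 200, 6                 # hexacopter: six motors
true_w = rng.normal(size=n_motors)           # synthetic ground-truth weights
powers = rng.uniform(50.0, 150.0, size=(n_samples, n_motors))   # motor power [W]
wind = powers @ true_w + 0.1 * rng.normal(size=n_samples)       # wind speed [m/s]

# Ordinary least squares with a bias column
X = np.hstack([powers, np.ones((n_samples, 1))])
coef, *_ = np.linalg.lstsq(X, wind, rcond=None)

# Predict the wind-speed component for a new power reading
p_new = np.array([100.0, 98.0, 110.0, 95.0, 102.0, 99.0])
wind_hat = np.append(p_new, 1.0) @ coef
```

In this toy setting the regression recovers the synthetic weights almost exactly; with real wind tunnel data, feature selection (e.g. which power-derived quantities enter the model) is where the actual work lies.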

    External Wrench Estimation, Collision Detection, and Reflex Reaction for Flying Robots

    Flying in unknown environments may lead to unforeseen collisions, which can cause serious damage to the robot and/or its environment. In this context, fast and robust collision detection combined with a safe reaction is therefore essential and may be achieved using external wrench information. Deliberate physical interaction likewise requires a control loop designed for that purpose and may require knowledge of the contact wrench. In principle, the external wrench may be measured or estimated. Whereas direct measurement places large demands on sensor equipment, adds weight, and compromises overall system robustness, in this paper we present a novel model-based method for external wrench estimation for flying robots. The algorithm is based only on the onboard inertial measurement unit and the robot's dynamics model. We design admittance and impedance controllers that use this estimate for sensitive and robust physical interaction. Furthermore, the performance of several collision detection and reaction schemes is investigated in order to ensure collision safety. The identified collision location and the associated normal vector on the robot's convex hull may then be used for sensorless tactile sensing. Finally, a low-level collision reflex layer is provided for flying robots for when obstacle avoidance fails, also under wind influence. Our experimental and simulation results show that the methodologies are easily implemented and effective in practice.
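A model-based wrench estimator of the kind the abstract describes is often built as a momentum observer. The 1-D sketch below is a generic illustration under assumed dynamics and gains, not the paper's exact filter: the observer integrates modelled thrust plus its own estimate, compares against momentum from measured acceleration, and the residual converges to the external force. A threshold on the estimate then gives a simple collision flag.

```python
import numpy as np

# Minimal 1-D momentum-based external force observer (illustrative;
# the paper's actual formulation may differ). Gains, mass, and the
# simulated disturbance below are assumptions for demonstration.

def momentum_observer(mass, dt, thrust, accel_meas, k_obs=20.0):
    """Estimate the external force along one axis.

    thrust:     modelled propeller force [N]
    accel_meas: measured, gravity-compensated acceleration [m/s^2]
    """
    n = len(thrust)
    f_ext_hat = np.zeros(n)
    p = 0.0          # momentum integrated from measured acceleration
    integral = 0.0   # integral of modelled force + current estimate
    for k in range(1, n):
        p += mass * accel_meas[k] * dt
        integral += (thrust[k] + f_ext_hat[k - 1]) * dt
        f_ext_hat[k] = k_obs * (p - integral)   # first-order residual observer
    return f_ext_hat

# Simulate a hovering robot hit by a 2 N push after 0.2 s
mass, dt = 1.5, 0.002
t = np.arange(0.0, 1.0, dt)
f_ext = np.where(t > 0.2, 2.0, 0.0)       # external force [N]
thrust = np.zeros_like(t)                 # hover, gravity already compensated
accel = (thrust + f_ext) / mass           # ideal IMU measurement
f_hat = momentum_observer(mass, dt, thrust, accel)

# Simple collision detection: flag when the estimate crosses a threshold
collision = np.abs(f_hat) > 1.0
```

The estimate follows the true force with a first-order lag set by `k_obs`; in practice the detection threshold must sit above the estimator's noise floor to avoid false positives.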

    Robust Visual-Inertial State Estimation with Multiple Odometries and Efficient Mapping on an MAV with Ultra-Wide FOV Stereo Vision

    We present a multicopter system equipped with two pairs of wide-angle stereo cameras and an inertial measurement unit (IMU) for robust visual-inertial navigation and time-efficient omni-directional 3D mapping of a large area of interest. The four cameras cover a 240-degree stereo field of view (FOV) vertically, which also makes the system suitable for cramped and confined environments such as caves. In our approach, we synthesize eight virtual pinhole cameras from the four wide-angle cameras. Each of the resulting four synthesized pinhole stereo systems provides input to an independent visual odometry (VO). The four individual motion estimates are then fused with data from the IMU, based on their consistency with the state estimate. We describe the configuration and image processing of the vision system, as well as the sensor fusion and mapping pipeline on board the MAV. We demonstrate the robustness of our multi-VO approach for visual-inertial navigation and present results of a 3D-mapping experiment.
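The consistency-based fusion of several VO estimates can be sketched as an innovation-gated, inverse-variance average. This is a generic illustration under assumed noise models and gating, not the paper's estimator: each VO velocity is compared against the IMU-predicted velocity, rejected if its normalized innovation exceeds a gate, and the survivors are averaged.

```python
import numpy as np

# Illustrative sketch of consistency-based fusion of multiple visual
# odometry (VO) velocity estimates with an IMU-predicted state.
# Gate, sigmas, and data are assumptions for demonstration only.

def fuse_vo(predicted_vel, vo_estimates, vo_sigmas, gate=3.0):
    """Fuse only the VO estimates consistent with the prediction.

    A VO measurement is rejected when its normalized innovation
    ||vo - predicted|| / sigma exceeds the gate.
    """
    predicted_vel = np.asarray(predicted_vel, dtype=float)
    fused_num = np.zeros_like(predicted_vel)
    fused_den = 0.0
    accepted = []
    for vo, sigma in zip(vo_estimates, vo_sigmas):
        vo = np.asarray(vo, dtype=float)
        innovation = np.linalg.norm(vo - predicted_vel) / sigma
        if innovation <= gate:
            w = 1.0 / sigma**2            # inverse-variance weight
            fused_num += w * vo
            fused_den += w
            accepted.append(True)
        else:
            accepted.append(False)        # outlier VO, e.g. tracking failure
    fused = fused_num / fused_den if fused_den > 0 else predicted_vel
    return fused, accepted

# Four VOs, one of which has lost tracking and drifts badly
predicted = [1.0, 0.0, 0.0]
vo_list = [[1.02, 0.01, 0.00],
           [0.98, -0.02, 0.01],
           [5.00, 2.00, 0.00],      # inconsistent outlier
           [1.01, 0.00, -0.01]]
sigmas = [0.1, 0.1, 0.1, 0.1]
fused, accepted = fuse_vo(predicted, vo_list, sigmas)
```

With multiple overlapping stereo systems, a single failing VO (e.g. one facing a textureless wall) is simply outvoted, which is the robustness argument the abstract makes.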